Supplementary Materials: A Two-Streamed Network for Estimating Fine-Scaled Depth Maps from Single RGB Images
Abstract
We show example 3D reconstructions resulting from the depth estimates generated by our proposed method. We sort the 654 test images according to the RMS error and show 20 scenes each with the lowest (Figures 2, 3), medium (Figures 4, 5), and highest (Figures 6, 7) error. The accuracy of our method according to the RMS error aligns roughly with the depth range in the image. For example, most depths in the 20 lowest-error scenes are smaller than 6 m; the medium-error scenes have depths limited to approximately 8 m, while the highest-error scenes have very large depths exceeding 10 m, which is also the limit of the Kinect sensor. However, even in the highest-error scenes, we reconstruct plausible-looking 3D scenes that preserve the overall structure; the resulting errors are associated instead with inaccuracies in the overall depth scale of the scene.

We compare our two fusion methods to the state-of-the-art ResNet-50 results from Laina et al. [1]. The numerical evaluation in [1] reports a higher accuracy than ours, but we find their 3D projections to be distorted and to suffer from more artifacts. Often, structures are unidentifiable, and the entire reconstructed 3D surface seems to suffer from grid-like artifacts, possibly due to their up-projection methodology. There is little difference in the 3D projections between our two fusion methods; detailing from the optimization is at times slightly sharper than from the end-to-end training.
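For reference, the scene-ranking step described above can be reproduced in a few lines of NumPy. This is a minimal sketch under our own assumptions: the predicted and ground-truth Kinect depth maps are available as same-shaped arrays in meters, and the helper names (rms_error, group_scenes) are illustrative rather than taken from the paper or its code.

import numpy as np

def rms_error(pred, gt):
    # Root-mean-square error between one predicted and one ground-truth
    # depth map: RMSE = sqrt(mean((pred - gt)^2)), in meters.
    return np.sqrt(np.mean((pred - gt) ** 2))

def group_scenes(preds, gts, k=20):
    # Rank all test scenes by RMS error and return the indices of the k
    # lowest-, median-, and highest-error scenes (the grouping used for
    # Figures 2-3, 4-5, and 6-7, respectively).
    errors = np.array([rms_error(p, g) for p, g in zip(preds, gts)])
    order = np.argsort(errors)  # ascending RMS error
    mid = len(order) // 2
    return order[:k], order[mid - k // 2 : mid + k // 2], order[-k:]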
Similar resources
Out-of-focus: Learning Depth from Image Bokeh for Robotic Perception
In this project, we propose a novel approach for estimating depth from RGB images. Traditionally, most work uses a single RGB image to estimate depth, which is inherently difficult and generally results in poor performance – even with thousands of data examples. In this work, we alternatively use multiple RGB images that were captured while changing the focus of the camera’s lens. This method l...
Sparse-to-Dense: Depth Prediction from Sparse Depth Samples and a Single Image
We consider the problem of dense depth prediction from a sparse set of depth measurements and a single RGB image. Since depth estimation from monocular images alone is inherently ambiguous and unreliable, to attain a higher level of robustness and accuracy, we introduce additional sparse depth samples, which are either acquired with a low-resolution depth sensor or computed via visual Simultane...
Depth CNNs for RGB-D scene recognition: learning from scratch better than transferring from RGB-CNNs
Scene recognition with RGB images has been extensively studied and has reached very remarkable recognition levels, thanks to convolutional neural networks (CNN) and large scene datasets. In contrast, current RGB-D scene data is much more limited, so often leverages RGB large datasets, by transferring pretrained RGB CNN models and fine-tuning with the target RGB-D dataset. However, we show that ...
Depth Pooling Based Large-scale 3D Action Recognition with Convolutional Neural Networks
This paper proposes three simple, compact yet effective representations of depth sequences, referred to respectively as Dynamic Depth Images (DDI), Dynamic Depth Normal Images (DDNI) and Dynamic Depth Motion Normal Images (DDMNI), for both isolated and continuous action recognition. These dynamic images are constructed from a segmented sequence of depth maps using hierarchical bidirectional ran...
Revisiting Single Image Depth Estimation: Toward Higher Resolution Maps with Accurate Object Boundaries
We revisit the problem of estimating depth of a scene from its single RGB image. Despite the recent success of deep learning based methods, we show that there is still room for improvement in two aspects by training a deep network consisting of two sub-networks: a base network for providing an initial depth estimate, and a refinement network for refining it. First, spatial resolution of the est...